We've built a system to capture structured data going forward. We've also spent time looking at ways of structuring our data – turning bronze into silver – making it ready to work with AI.
But what about existing unstructured data? Can our AI tools help extract useful, strategic insights from it?
Good news: AI tools can help extract structure from the mess. They can glean meaning from tangled documents.
Previously, when working with numerical data, 100% accuracy was required to generate meaningful insights, especially for financial reporting or forecasting. That data needed to be analytics-ready: correct, stable, and suitable for dashboards and reports.
This afternoon's data is text-based and more diverse in form and purpose. It is not worthless – there is gold hidden among the bronze. Important decisions, details lost in email chains, and useful trends often live here.
Manually reviewing large volumes of text would be extremely time-consuming.
This is where AI can step in.
This example demonstrates an advanced workflow used to compare an AI bootcamp curriculum with official UK government guidance on AI workforce skills.
The data sources included:
The goal: Identify alignment, find gaps, and create an actionable improvement roadmap.
Quick technical note: All sources were converted into a single data format: Markdown (.md).
Why this matters: a single, consistent plain-text format is easy for AI tools to read, compare, and reference. This is the bronze-to-silver data cleaning principle in action.
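As a rough sketch of that conversion step (the folder names and the assumption that sources arrive as plain-text exports are hypothetical), collecting everything into one Markdown corpus might look like:

```python
from pathlib import Path

def convert_to_markdown(source_dir: Path, output_dir: Path) -> list[Path]:
    """Copy each plain-text export into a .md file with a title heading."""
    output_dir.mkdir(parents=True, exist_ok=True)
    written = []
    for src in sorted(source_dir.glob("*.txt")):
        body = src.read_text(encoding="utf-8")
        md_path = output_dir / (src.stem + ".md")
        # A level-1 heading makes each document self-describing for the AI.
        md_path.write_text(f"# {src.stem}\n\n{body}", encoding="utf-8")
        written.append(md_path)
    return written
```

Real source formats (Word, PDF, email) would each need their own extraction step first; the point is that everything ends up in one predictable format.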
Rather than analysing everything at once, a chunked approach was used:
This approach reduces failure points and allows human check-ins.
Why chunking works: Each source gets proper attention, patterns emerge naturally, context limits are avoided, and understanding builds incrementally.
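The batching logic behind this can be sketched in a few lines. Here `analyse` is a stand-in for whatever AI call does the real work (a purely hypothetical callable), and the batch boundary is where a human check-in would happen:

```python
from pathlib import Path
from typing import Callable

def analyse_in_chunks(
    sources: list[Path],
    analyse: Callable[[str], str],
    chunk_size: int = 4,
) -> dict[str, str]:
    """Analyse sources a small batch at a time instead of all at once."""
    results: dict[str, str] = {}
    for i in range(0, len(sources), chunk_size):
        batch = sources[i : i + chunk_size]
        for doc in batch:
            results[doc.name] = analyse(doc.read_text(encoding="utf-8"))
        # Natural check-in point: review this batch before continuing,
        # so a failure or a drifting analysis is caught early.
        print(f"Completed batch {i // chunk_size + 1}: {[d.name for d in batch]}")
    return results
```

Keeping batches small means each source gets a focused pass, and no single request has to fit the entire corpus into one context window.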
Each analysis followed a consistent structure:
- Key Government Recommendations
- Alignment with Current Approach
  - Strong alignments
  - Partial alignments
- Gaps Identified
- Quick Wins
This made results quick to scan and easy to compare across sources.
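One way to enforce that consistency (a minimal sketch; the generated headings simply mirror the structure above, and the function name is hypothetical) is to generate every report from the same skeleton:

```python
TEMPLATE_SECTIONS = [
    "Key Government Recommendations",
    "Alignment with Current Approach",
    "Gaps Identified",
    "Quick Wins",
]

def report_skeleton(source_name: str) -> str:
    """Build an empty Markdown report with the shared section headings."""
    lines = [f"# Analysis: {source_name}", ""]
    for section in TEMPLATE_SECTIONS:
        lines += [f"## {section}", ""]
        if section == "Alignment with Current Approach":
            # Subheadings for the two kinds of alignment.
            lines += ["### Strong alignments", "", "### Partial alignments", ""]
    return "\n".join(lines)
```

Because every report shares the same headings, results from different sources line up side by side.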
The analysis showed strong alignment with government guidance, while also revealing opportunities for improvement and new material. Unexpected trends emerged, including common learner questions and recurring technical challenges.
Done manually, this work would normally require extensive reading, comparison, and note-taking.
Instead, it took a couple of hours with an AI coding assistant, structured prompts, and organised thinking—producing 16 analysis documents and a prioritised action plan.
Once the prompts were built, the AI worked autonomously with only periodic check-ins.
Same principles, simpler tools.
You do not need software engineering tools to do this. We will look at unstructured Word documents—summary emails sent to learners after past bootcamp sessions—and use browser-based AI tools to extract insight.
Live demo: ChatGPT, Gemini, Copilot, and NotebookLM.
NotebookLM’s Data Tables feature is purpose-built for turning unstructured data into reliable, queryable tables.
NotebookLM’s research tools can also gather source material for you.
We will now carry out a short competitor research task and explore the material NotebookLM gathers.